    A Probabilistic Peeling Decoder to Efficiently Analyze Generalized LDPC Codes Over the BEC

    In this paper, we analyze the tradeoff between coding rate and asymptotic performance of a class of generalized low-density parity-check (GLDPC) codes constructed by including a certain fraction of generalized constraint (GC) nodes in the graph. The rate of the GLDPC ensemble is bounded using classical results on linear block codes, namely, the Hamming bound and the Varshamov bound. We also study the impact of the decoding method used at GC nodes. To incorporate both bounded-distance (BD) and maximum likelihood (ML) decoding at GC nodes into our analysis without resorting to multi-edge-type degree distributions (DDs), we propose the probabilistic peeling decoding (P-PD) algorithm, which models the decoding step at every GC node as an instance of a Bernoulli random variable with a successful decoding probability that depends on both the GC block code and its decoding algorithm. The P-PD asymptotic performance over the BEC can be efficiently predicted using standard techniques for LDPC codes, such as density evolution (DE) or the differential equation method. Furthermore, for a class of GLDPC ensembles, we demonstrate that the simulated P-PD performance accurately predicts the actual performance of the GLDPC code under ML decoding at GC nodes. We illustrate our analysis for GLDPC code ensembles with regular and irregular DDs. In all cases, we show that a large fraction of GC nodes is required to reduce the original gap to capacity, but the optimal fraction is strictly smaller than one. We then consider techniques to further reduce the gap to capacity by means of random puncturing and the inclusion of a certain fraction of generalized variable nodes in the graph.
    This work was supported in part by the Spanish Ministerio de Economía y Competitividad and the Agencia Española de Investigación under Grant TEC2016-78434-C3-3-R (AEI/FEDER, EU) and in part by the Comunidad de Madrid in Spain under Grant S2103/ICE-2845, Grant IND2017/TIC-7618, Grant IND2018/TIC-9649, and Grant Y2018/TCS-4705. P. M. Olmos was further supported by the Spanish Ministerio de Economía y Competitividad under Grant IJCI-2014-19150. T. Koch was further supported by the European Research Council (ERC) through the European Union's Horizon 2020 research and innovation programme under Grant 714161, by the 7th European Union Framework Programme under Grant 333680, and by the Spanish Ministerio de Economía y Competitividad under Grant TEC2013-41718-R and Grant RYC-2014-16332.
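
    To make the Bernoulli model concrete, here is a minimal sketch (illustrative code, not from the paper) of a peeling decoder over the BEC in which each constraint node's decoding attempt is gated by a success probability supplied by the caller; the graph representation and the `p_success` interface are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_pd_decode(H, erased, p_success, max_iters=100):
    """One possible reading of P-PD over the BEC (illustrative sketch).

    H         : (checks, vars) 0/1 biadjacency matrix of the Tanner graph
    erased    : boolean vector, True where a variable node is still erased
    p_success : maps the number of erased neighbors of a constraint node
                to the success probability of its component decoder
                (the Bernoulli model of the GC node)
    """
    erased = erased.copy()
    for _ in range(max_iters):
        progress = False
        for c in range(H.shape[0]):
            nbrs = np.flatnonzero(H[c])
            n_erased = int(erased[nbrs].sum())
            if n_erased == 0:
                continue
            # The GC node's decoding attempt is a Bernoulli trial whose
            # success probability depends on the residual erasures.
            if rng.random() < p_success(n_erased):
                erased[nbrs] = False   # peel all recovered neighbors
                progress = True
        if not progress:
            break
    return erased
```

    With `p_success = lambda n: 1.0 if n == 1 else 0.0` this reduces to the classical peeling decoder for SPC nodes; a BD decoder of erasure-correcting capability t would correspond to `1.0 if n <= t else 0.0`, while ML decoding at a GC node yields an intermediate, code-dependent probability.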

    On LDPC Code Ensembles with Generalized Constraints

    Proceedings of: 2017 IEEE International Symposium on Information Theory, Aachen, Germany, 25-30 June 2017.
    In this paper, we analyze the tradeoff between coding rate and asymptotic performance of a class of generalized low-density parity-check (GLDPC) codes constructed by including a certain fraction of generalized constraint (GC) nodes in the graph. The rate of the GLDPC ensemble is bounded using classical results on linear block codes, namely, the Hamming bound and the Varshamov bound. We also study the impact of the decoding method used at GC nodes. To incorporate both bounded-distance (BD) and maximum likelihood (ML) decoding at GC nodes into our analysis without having to resort to multi-edge-type degree distributions (DDs), we propose the probabilistic peeling decoding (P-PD) algorithm, which models the decoding step at every GC node as an instance of a Bernoulli random variable with a success probability that depends on the GC block code and its decoding algorithm. The P-PD asymptotic performance over the BEC can be efficiently predicted using standard techniques for LDPC codes, such as density evolution (DE) or the differential equation method. Furthermore, for a class of GLDPC ensembles, we demonstrate that the simulated P-PD performance accurately predicts the actual performance of the GLDPC code. We illustrate our analysis for GLDPC code ensembles using (2,6) and (2,15) base DDs. In all cases, we show that a large fraction of GC nodes is required to reduce the original gap to capacity.
    This work has been funded in part by the Spanish Ministerio de Economía y Competitividad and the Agencia Española de Investigación under Grant TEC2016-78434-C3-3-R (AEI/FEDER, EU) and by the Comunidad de Madrid in Spain under Grant S2103/ICE-2845. T. Koch has further received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement number 714161), from the 7th European Union Framework Programme under Grant 333680, and from the Spanish Ministerio de Economía y Competitividad under Grants TEC2013-41718-R and RYC-2014-16332. Pablo M. Olmos has further received funding from the Spanish Ministerio de Economía y Competitividad under Grant IJCI-2014-19150.

    Knowledge transfer activities in Humanities and Social Sciences: which determinants explain research group interactions with non-academic agents?

    Paper presented at the DIME-DRUID Academy Winter Conference: "Economics and Management of Innovation, Technology and Organizations", held in Aalborg (Denmark), 20-22 January 2011.
    In today's society, universities and research centers have acquired an important role as agents responsible for knowledge transfer (KT) to the non-academic environment (OECD 1996). The different ways in which these collaborations take place have been the subject of many conceptual (Molas-Gallart et al. 2002) and empirical (D'Este and Patel 2007; Landry et al. 2007) studies in recent years. The aim of this exploratory study is to contribute to the KT literature from a generally neglected area of study, the humanities and social sciences (HSS), and from a unit of analysis that has received less attention: the research group. The questions addressed in this study are: what are the main KT activities used by HSS research groups to collaborate with non-academic agents? Do group characteristics or the group leader's profile influence the group's engagement in a specific knowledge transfer activity? Data for this study were gathered through questionnaires, interviews, and databases for a sample of 79 research groups (80% of the population) belonging to the HSS area of the Spanish Council for Scientific Research (CSIC). Descriptive and multivariate analyses were conducted. Results indicate that HSS research groups are very active in some KT activities, such as technical advice, consultancy, and contract research, whereas their involvement in personnel mobility activities is low. Logistic regression analysis shows that the likelihood that research groups engage in the various KT activities is not explained by the same factors. However, we find evidence of one variable that is positively related to the engagement of HSS research groups in almost all of the activities analyzed: the focus on the social utility of the research.
    The study benefited from financial support from the Spanish National R&D Plan (Ref.: SEJ2005-24033-E) and the Valencian Regional Government (Ref.: GV06/225).
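
    As a hypothetical illustration of the kind of model such a study fits, the sketch below runs one logistic regression of engagement on group traits using simulated data; the covariate names and coefficients are invented, and one such model would be fitted per KT activity.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data with made-up covariates; 79 groups as in the study.
rng = np.random.default_rng(1)
X = rng.normal(size=(79, 3))   # e.g. group size, leader seniority,
                               # focus on social utility (all hypothetical)
y = (X @ np.array([0.2, 0.1, 0.8]) + rng.normal(size=79) > 0).astype(int)

# Binary outcome: does the group engage in this KT activity?
fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(fit.summary())   # a positive, significant coefficient on the
                       # social-utility covariate would mirror the
                       # paper's common finding across activities
```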

    Medical data wrangling with sequential variational autoencoders

    Medical data sets are usually corrupted by noise and missing data. These missing patterns are commonly assumed to be completely random, but in medical scenarios the reality is that these patterns occur in bursts, due to sensors that are off for some time or data collected in a misaligned, uneven fashion, among other causes. This paper proposes to model medical data records with heterogeneous data types and bursty missing data using sequential variational autoencoders (VAEs). In particular, we propose a new methodology, the Shi-VAE, which extends the capabilities of VAEs to sequential streams of data with missing observations. We compare our model against state-of-the-art solutions in an intensive care unit (ICU) database and a dataset of passive human monitoring. Furthermore, we find that standard error metrics such as RMSE are not conclusive enough to assess temporal models, and we therefore include in our analysis the cross-correlation between the ground truth and the imputed signal. We show that Shi-VAE achieves the best performance in terms of both metrics, with lower computational complexity than the GP-VAE model, which is the state-of-the-art method for medical records.
    This work was supported in part by the Spanish Government MCI under Grants TEC2017-92552-EXP and RTI2018-099655-B-100, in part by Comunidad de Madrid under Grants IND2017/TIC-7618, IND2018/TIC-9649, IND2020/TIC-17372, and Y2018/TCS-4705, in part by the BBVA Foundation under the Deep-DARWiN project, and in part by the European Union (FEDER) and the European Research Council (ERC) through the European Union's Horizon 2020 research and innovation program under Grant 714161.
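
    One simple instantiation of the proposed evaluation, assuming the signals are aligned 1-D arrays over time, is to report RMSE together with the zero-lag normalized cross-correlation; this is a sketch, not necessarily the paper's exact metric definition.

```python
import numpy as np

def imputation_metrics(truth, imputed):
    """RMSE plus zero-lag normalized cross-correlation (sketch).

    The correlation term captures whether the imputed signal tracks
    the temporal shape of the ground truth, which RMSE alone can miss.
    """
    rmse = np.sqrt(np.mean((truth - imputed) ** 2))
    t = truth - truth.mean()
    s = imputed - imputed.mean()
    xcorr = float(t @ s / (np.linalg.norm(t) * np.linalg.norm(s)))
    return rmse, xcorr
```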

    Probabilistic Time of Arrival Localization

    In this letter, we take a new approach to time-of-arrival geo-localization. We show that the main sources of error in metropolitan areas are environmental imperfections that bias our solutions, and that we can rely on a probabilistic model to learn and compensate for them. Using measurements from a live LTE cellular network, we validate that the resulting localization error is less than 10 meters, an order-of-magnitude improvement.
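
    A minimal sketch of the idea, under the assumptions of synchronized clocks and a per-link range bias already learned by the probabilistic model, is a bias-compensated least-squares position fix; the function and its interface are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light in m/s

def toa_localize(anchors, toa, learned_bias):
    """Least-squares ToA position fix with learned range biases (sketch).

    anchors      : (K, 2) known base-station positions
    toa          : (K,) measured times of arrival, clocks assumed synced
    learned_bias : (K,) environmental range bias in meters, as learned
                   by a probabilistic model of the propagation errors
    """
    ranges = C * toa - learned_bias   # compensate the learned bias

    def residuals(p):
        return np.linalg.norm(anchors - p, axis=1) - ranges

    return least_squares(residuals, x0=anchors.mean(axis=0)).x
```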

    Continuous Transmission of Spatially Coupled LDPC Code Chains

    We propose a novel encoding/transmission scheme, called continuous chain (CC) transmission, that is able to improve the finite-length performance of a system using spatially coupled low-density parity-check (SC-LDPC) codes. In CC transmission, instead of transmitting a sequence of independent codewords from a terminated SC-LDPC code chain, we connect multiple chains in a layered format, where encoding, transmission, and decoding are performed in a continuous fashion. The connections between chains are created at specific points, chosen to improve the finite-length performance of the code structure under iterative decoding. We describe the design of CC schemes for different SC-LDPC code ensembles constructed from protographs: a (J,K)-regular SC-LDPC code chain, a spatially coupled repeat-accumulate (SC-RA) code, and a spatially coupled accumulate-repeat-jagged-accumulate (SC-ARJA) code. In all cases, significant performance improvements are reported, and we show that CC transmission requires only a small increase in decoding complexity and decoding delay with respect to a system employing a single SC-LDPC code chain for transmission.
    This material is based upon work supported in part by the National Science Foundation under Grant Nos. CCF-1161754 and CCSS-1710920, in part by NSERC Canada, and in part by the Spanish Ministry of Economy and Competitiveness and the Spanish National Research Agency under Grant TEC2016-78434-C3-3-R (AEI/FEDER, EU) and Juan de la Cierva Fellowship IJCI-2014-19150.
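
    As a rough sketch of the underlying constructions, assuming the standard edge-spreading view of protograph-based SC-LDPC codes, the following builds a terminated chain's base matrix and shows one illustrative (not the paper's) way of coupling two chains at a connection point; both function names are invented for this example.

```python
import numpy as np

def terminated_chain(L, comps):
    """Protograph base matrix of a terminated SC-LDPC chain (sketch).

    comps : component base matrices B_0..B_w from edge spreading
    L     : chain length (number of coupled positions)
    """
    b, k = comps[0].shape
    w = len(comps) - 1
    B = np.zeros(((L + w) * b, L * k), dtype=int)
    for t in range(L):
        for i, Bi in enumerate(comps):
            B[(t + i) * b:(t + i + 1) * b, t * k:(t + 1) * k] = Bi
    return B

# (3,6)-regular example: block base matrix [3 3] spread into three [1 1]s.
B_chain = terminated_chain(10, [np.array([[1, 1]])] * 3)

def connect_chains(Ba, Bb, Bc):
    """Illustration only: join two chains by coupling the last variable
    positions of chain A into the first constraint rows of chain B; the
    paper chooses such connection points to aid iterative decoding."""
    (ra, ca), (rb, cb) = Ba.shape, Bb.shape
    B = np.zeros((ra + rb, ca + cb), dtype=int)
    B[:ra, :ca], B[ra:, ca:] = Ba, Bb
    B[ra:ra + Bc.shape[0], ca - Bc.shape[1]:ca] = Bc
    return B
```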

    Readout circuit with improved sensitivity for contactless LC sensing tags

    In this work, we present a novel technique to estimate the resonance frequency of LC chipless tags (parallel inductor-capacitor circuits) with improved sensitivity and linearity. The developed reader measures the power consumption of a Colpitts oscillator during a frequency sweep. The readout circuit consists of a Colpitts oscillator with a coil antenna, varactor diodes to tune the oscillator frequency, analog circuitry to measure the power consumption, and a microcontroller that controls the whole system and sends the data to a PC via USB. When an LC tag is inductively coupled to the oscillator, without contact, a peak in power consumption is observed. As shown by an experimental calibration using an LC tag made on an FR4 substrate, the frequency of this peak is related to the resonance frequency: the two parameters present an excellent linear dependence with a high correlation factor (R² = 0.995). Finally, a screen-printed LC tag was fabricated and used as a relative humidity sensor, achieving a sensitivity of (−2.41 ± 0.21) kHz/% with an R² of 0.946.
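
    On the software side, the mapping from a sweep to a resonance estimate can be sketched as peak finding plus the linear calibration described above; the parabolic refinement and the function interface are assumptions of this example, not the paper's firmware.

```python
import numpy as np

def resonance_from_sweep(freqs, power, slope, intercept):
    """Map the power-consumption peak of a frequency sweep to a
    resonance estimate (sketch); slope/intercept come from a linear
    calibration such as the one done with the FR4 reference tag."""
    i = int(np.argmax(power))          # coarse peak of power draw
    f_peak = freqs[i]
    if 0 < i < len(freqs) - 1:
        # 3-point parabolic interpolation refines the peak location
        y0, y1, y2 = power[i - 1], power[i], power[i + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            f_peak += 0.5 * (y0 - y2) / denom * (freqs[i + 1] - freqs[i])
    return slope * f_peak + intercept  # calibrated resonance frequency
```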

    Spatially coupled generalized LDPC codes: asymptotic analysis and finite length scaling

    Generalized low-density parity-check (GLDPC) codes are a class of LDPC codes in which the standard single parity-check (SPC) constraints are replaced by constraints defined by a linear block code. These stronger constraints typically result in improved error-floor performance, due to better minimum distance and trapping set properties, at the cost of some increased decoding complexity. In this paper, we study spatially coupled generalized low-density parity-check (SC-GLDPC) codes and present a comprehensive analysis of these codes, including: (1) an iterative decoding threshold analysis of SC-GLDPC code ensembles, demonstrating capacity-approaching thresholds via the threshold saturation effect; (2) an asymptotic analysis of the minimum distance and free distance properties of SC-GLDPC code ensembles, demonstrating that the ensembles are asymptotically good; and (3) an analysis of the finite-length scaling behavior of both GLDPC block codes and SC-GLDPC codes based on a peeling decoder (PD) operating on a binary erasure channel (BEC). Results are compared to GLDPC block codes, and the advantages and disadvantages of SC-GLDPC codes are discussed.
    This work was supported in part by the National Science Foundation under Grant ECCS-1710920, Grant OIA-1757207, and Grant HRD-1914635; in part by the European Research Council (ERC) through the European Union's Horizon 2020 research and innovation program under Grant 714161; and in part by the Spanish Ministry of Science, Innovation and Universities under Grant TEC2016-78434-C3-3-R (AEI/FEDER, EU).
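
    For the BEC threshold analysis, a minimal density-evolution sketch for the underlying (J,K)-regular LDPC ensemble reads as follows; incorporating GC nodes would replace the SPC check update with the component code's erasure-recovery function, and this bisection routine is illustrative rather than the paper's actual tool.

```python
def de_threshold(J, K, tol=1e-7):
    """BEC density-evolution threshold of a (J,K)-regular LDPC ensemble
    (sketch). For a GLDPC ensemble, the check update 1 - (1 - x)**(K-1)
    would be replaced by the GC code's erasure EXIT function."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:               # bisect on the channel erasure rate
        eps = 0.5 * (lo + hi)
        x = eps                        # erasure probability of a VN message
        for _ in range(5000):
            x = eps * (1 - (1 - x) ** (K - 1)) ** (J - 1)
        if x < 1e-6:
            lo = eps                   # erasures die out: below threshold
        else:
            hi = eps
    return 0.5 * (lo + hi)

# de_threshold(3, 6) gives roughly 0.4294, against a Shannon limit of 0.5.
```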

    Probabilistic MIMO symbol detection with expectation consistency approximate inference

    In this paper, we explore low-complexity probabilistic algorithms for soft symbol detection in high-dimensional multiple-input multiple-output (MIMO) systems. We present a novel algorithm based on the expectation consistency (EC) framework, which formulates the approximate inference problem as an optimization over a nonconvex function. EC generalizes algorithms such as belief propagation and expectation propagation. For the MIMO symbol detection problem, we discuss feasible methods to find stationary points of the EC function and explore their tradeoffs between accuracy and speed of convergence. We first study accuracy in terms of input-output mutual information and show that the proposed EC MIMO detector greatly improves over state-of-the-art methods, with a complexity that is cubic in the number of transmit antennas. Second, these gains are corroborated by combining the probabilistic output of the EC detector with a low-density parity-check channel code.
    This work has been partly supported by the Ministerio de Economía of Spain jointly with the European Commission (ERDF) under projects MIMOTEX (TEC2014-61776-EXP), CIES (RTC-2015-4213-7), ELISA (TEC2014-59255-C3-3R), FLUID (TEC2016-78434-C3-3-R), and CAIMAN (TEC2017-86921-C2-2-R), by the Juan de la Cierva program (IJCI-2014-19150), and by the Comunidad de Madrid (project "CASI-CAM-CM", id. S2013/ICE-2845).
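
    To give a flavor of this algorithm family, here is a simplified expectation-propagation-style detector for real BPSK symbols, whose fixed points are the kind of stationary points the EC framework characterizes; the interface, damping choice, and BPSK restriction are assumptions of this sketch, not the paper's exact method.

```python
import numpy as np

def ec_mimo_detect(H, y, sigma2, n_iter=10, damp=0.7):
    """EP/EC-style soft MIMO detector for BPSK symbols (sketch).

    Iterates moment matching between a Gaussian approximation of the
    posterior of y = Hx + n and the discrete prior x in {-1, +1}.
    """
    n_t = H.shape[1]
    lam = np.ones(n_t)        # site precisions (unit-energy Gaussian start)
    gam = np.zeros(n_t)       # site precision-times-mean terms
    HtH, Hty = H.T @ H / sigma2, H.T @ y / sigma2
    mu = np.zeros(n_t)
    for _ in range(n_iter):
        Sigma = np.linalg.inv(HtH + np.diag(lam))   # Gaussian posterior
        mu = Sigma @ (Hty + gam)
        s = np.diag(Sigma)
        # Cavity distribution: remove each site's own contribution.
        h = 1.0 / s - lam                    # cavity precisions
        t = (mu / s - gam) / h               # cavity means
        # Moment matching against the BPSK prior.
        m = np.tanh(t * h)                   # tilted mean E[x]
        v = np.clip(1.0 - m ** 2, 1e-8, None)  # tilted variance
        lam_new = 1.0 / v - h
        gam_new = m / v - t * h
        ok = lam_new > 0                     # keep site precisions positive
        lam[ok] = damp * lam_new[ok] + (1 - damp) * lam[ok]
        gam[ok] = damp * gam_new[ok] + (1 - damp) * gam[ok]
    return mu                                # soft symbol estimates
```

    The per-iteration cost is dominated by the n_t x n_t matrix inversion, which matches the cubic complexity order quoted in the abstract.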